Human Activity Recognition (HAR) using wearable sensors is an increasingly relevant area for applications in healthcare, rehabilitation, and human–computer interaction. However, publicly available datasets providing multi-sensor, synchronized data that combine inertial and orientation measurements remain limited. This work introduces a publicly available HAR dataset captured using wearable sensors placed on the chest, hands, and knees. Each device recorded inertial and orientation data during controlled activity sessions involving participants aged 20 to 70 years. A standardized acquisition protocol ensured consistent temporal alignment across all signals. The dataset was preprocessed and segmented using a sliding-window approach. An initial baseline classification experiment, employing a combined Convolutional Neural Network (CNN) and Long Short-Term Memory (LSTM) model, achieved an average accuracy of 93.5% in classifying activities. The dataset is publicly available in CSV format and includes raw sensor signals, activity labels, and metadata. It offers a valuable resource for evaluating machine learning models, studying distributed HAR approaches, and developing robust activity recognition pipelines built on wearable technologies.
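The sliding-window segmentation mentioned above can be sketched as follows. The window length and overlap below are illustrative assumptions, since the abstract does not state the values actually used:

```python
import numpy as np

def sliding_windows(signal, window_size, step):
    """Segment a multichannel signal (samples x channels) into
    fixed-length, possibly overlapping windows.

    window_size and step are hypothetical; the dataset's actual
    segmentation parameters are not given in the abstract.
    """
    n_samples = signal.shape[0]
    starts = range(0, n_samples - window_size + 1, step)
    # Stack each window into a (num_windows, window_size, channels) array
    return np.stack([signal[s:s + window_size] for s in starts])

# Example: 1000 samples of 6-channel inertial data (e.g., 3-axis
# accelerometer + 3-axis gyroscope), 200-sample windows, 50% overlap
data = np.random.randn(1000, 6)
windows = sliding_windows(data, window_size=200, step=100)
print(windows.shape)  # (9, 200, 6)
```

Each resulting window would then be paired with an activity label and fed to the CNN-LSTM classifier.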